What is an “AI doomer”?
“AI doomer” is a label for someone who is concerned about AI-caused extinction (or comparable AI catastrophes). It is often used pejoratively by people who take a dim view of such concerns.
It’s worth distinguishing a few kinds of views that might cause someone to be called an “AI doomer”:
- “AI-caused extinction is plausible enough to pay attention to and to make tradeoffs to prevent.” This view is held by many AI researchers, engineers, policy analysts, safety researchers, and heads of AI labs, as illustrated by the CAIS statement on AI risk.
- “AI-caused extinction is highly probable.” People with this view generally think that although we will probably fail to avoid extinction, it’s worth trying. This includes Eliezer Yudkowsky and others at MIRI.
- “AI-caused extinction is inevitable and not even worth trying to prevent.” This sense of “doomer” is often used in other contexts, like climate change, but it’s uncommon for people to be fatalistic about AI risk in this way.
Outside the context of existential risk, the term is occasionally applied to ideas as varied as “AI will doom artists to unemployment” and even “AI is doomed to stay at a low capability level.” And outside the context of AI, “doomer” can refer to anyone who is pessimistic about technological progress (like a Luddite), or about the future in general. This adds further confusion, because many “AI doomers” are optimistic about technological progress in general but carve out an exception for AI.
Fixing the ambiguity
While no alternative to “AI doomer” has gained traction, one attempt at better-defined terms comes from Rob Bensinger:
- “AGI risk fractionalists”: p(doom) < 2%
- “AGI-wary”: p(doom) around 2-20%
- “AGI-alarmed”: p(doom) around 20-80%
- “AGI-grim”: p(doom) > 80%
Bensinger also suggests terms like “AGI variabilists” and “AGI eventualists” for different preferred policies about whether and when to build AGI. The policy question is partly separate from that of how likely AI is to cause extinction, but the term “doomer” tends to conflate them.
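To make the taxonomy concrete, here is a minimal sketch of how the four bands partition the probability range. Python is our choice, not the source's; the function name `bensinger_label` is hypothetical, and the decision to assign boundary values (exactly 2%, 20%, 80%) to the higher band is our assumption, since the list above does not specify it:

```python
def bensinger_label(p_doom: float) -> str:
    """Map a p(doom) estimate in [0, 1] to one of Bensinger's proposed labels.

    Boundary values (exactly 0.02, 0.2, 0.8) fall into the higher band
    here; the original proposal leaves boundary handling unspecified.
    """
    if not 0.0 <= p_doom <= 1.0:
        raise ValueError("p(doom) must be a probability in [0, 1]")
    if p_doom < 0.02:
        return "AGI risk fractionalist"
    if p_doom < 0.2:
        return "AGI-wary"
    if p_doom < 0.8:
        return "AGI-alarmed"
    return "AGI-grim"
```

For example, `bensinger_label(0.5)` returns `"AGI-alarmed"`, while `bensinger_label(0.01)` returns `"AGI risk fractionalist"`.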
Further reading:
- Some reasons not to say “Doomer” by Ruby
- Ajeya Cotra, Shane Legg, and others discussing the relevance of the term